Simple Stopping Criteria for Information Theoretic Feature Selection

Authors
Abstract


Similar articles

Stopping Criteria for Ensemble-Based Feature Selection

Selecting the optimal number of features in a classifier ensemble normally requires a validation set or cross-validation techniques. In this paper, feature ranking is combined with Recursive Feature Elimination (RFE), which is an effective technique for eliminating irrelevant features when the feature dimension is large. Stopping criteria are based on the out-of-bootstrap (OOB) estimate and class s...
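
A minimal sketch of this kind of procedure is given below: features are eliminated recursively using a random-forest ranking, and elimination stops once the out-of-bootstrap (OOB) accuracy estimate degrades beyond a tolerance. The ranker, the tolerance, and the synthetic dataset are illustrative assumptions, not the exact setup of the cited paper.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data standing in for a real problem (illustrative only).
X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           random_state=0)

active = list(range(X.shape[1]))          # indices of features still in play
best_oob, best_active, tol = -np.inf, list(active), 0.01

while len(active) > 1:
    forest = RandomForestClassifier(n_estimators=200, bootstrap=True,
                                    oob_score=True, random_state=0)
    forest.fit(X[:, active], y)
    oob = forest.oob_score_               # out-of-bootstrap accuracy estimate
    if oob < best_oob - tol:              # OOB estimate degraded: stop eliminating
        break
    if oob >= best_oob:
        best_oob, best_active = oob, list(active)
    weakest = int(np.argmin(forest.feature_importances_))  # lowest-ranked feature
    del active[weakest]

print("selected features:", sorted(best_active))
print("OOB accuracy estimate:", round(best_oob, 3))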


Information-theoretic Criteria for Unit Selection Synthesis

In our recent work on concatenative speech synthesis, we have devised an efficient, graph-based search to perform unit selection given symbolic information. By encapsulating concatenation and substitution costs defined at the class level, the graph expands only linearly with respect to corpus size. To date, these costs were manually tuned over pre-specified classes, which was a knowledge-intensi...
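
For intuition about such a graph-based search, the sketch below treats unit selection as a shortest-path (Viterbi-style) problem that minimizes class-level substitution (target) and concatenation (join) costs. The classes, cost tables, and function names are invented for illustration and are not taken from the cited work.

# Toy class-level cost tables (invented values for illustration).
SUB_COST = {("a", "a"): 0.0, ("a", "b"): 1.0, ("b", "b"): 0.0, ("b", "a"): 1.0}
CON_COST = {("a", "a"): 0.2, ("a", "b"): 0.8, ("b", "a"): 0.8, ("b", "b"): 0.2}

def select_units(targets, candidates):
    """targets: target class per position; candidates: candidate classes per position."""
    # best[i][c] = (cost of cheapest path ending in class c at position i, backpointer)
    best = [{c: (SUB_COST[(targets[0], c)], None) for c in candidates[0]}]
    for i in range(1, len(targets)):
        layer = {}
        for c in candidates[i]:
            sub = SUB_COST[(targets[i], c)]
            prev, cost = min(((p, best[i - 1][p][0] + CON_COST[(p, c)])
                              for p in candidates[i - 1]), key=lambda t: t[1])
            layer[c] = (cost + sub, prev)
        best.append(layer)
    # Backtrace the cheapest path through the graph.
    last = min(best[-1], key=lambda c: best[-1][c][0])
    path = [last]
    for i in range(len(targets) - 1, 0, -1):
        path.append(best[i][path[-1]][1])
    return list(reversed(path))

print(select_units(["a", "b", "a"], [["a", "b"], ["a", "b"], ["a", "b"]]))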


Information-theoretic algorithm for feature selection

Feature selection is used to improve efficiency of learning algorithms by finding an optimal subset of features. However, most feature selection techniques can handle only certain types of data. Additional limitations of existing methods include intensive computational requirements and inability to identify redundant variables. In this paper, we are presenting a novel, information-theoretic alg...


Mutual Information Criteria for Feature Selection

In many data analysis tasks, one is often confronted with very high dimensional data. The feature selection problem is essentially a combinatorial optimization problem which is computationally expensive. To overcome this problem it is frequently assumed either that features independently influence the class variable or do so only involving pairwise feature interaction. In prior work [18], we ha...
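
The sketch below illustrates the pairwise-interaction approximation mentioned above with a greedy, mRMR-style criterion: each candidate feature is scored by its estimated mutual information with the class minus its average pairwise mutual information with the features already selected. The estimators, dataset, and subset size are assumptions chosen for illustration, not the method of the cited paper.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

# Synthetic data standing in for a real problem (illustrative only).
X, y = make_classification(n_samples=400, n_features=20, n_informative=4,
                           random_state=1)
relevance = mutual_info_classif(X, y, random_state=1)   # estimates of I(X_j; Y)

selected, remaining = [], list(range(X.shape[1]))
for _ in range(5):                                       # pick a subset of 5 features
    def score(j):
        if not selected:
            return relevance[j]
        # Average pairwise redundancy I(X_j; X_k) over already-selected features.
        redundancy = np.mean([mutual_info_regression(X[:, [j]], X[:, k],
                                                     random_state=1)[0]
                              for k in selected])
        return relevance[j] - redundancy
    best = max(remaining, key=score)
    selected.append(best)
    remaining.remove(best)

print("greedy MI selection order:", selected)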



Journal

Journal title: Entropy

Year: 2019

ISSN: 1099-4300

DOI: 10.3390/e21010099